
    Symmetry Breaking with Polynomial Delay

    A conservative class of constraint satisfaction problems (CSPs) is a class for which membership is preserved under arbitrary domain reductions. Many well-known tractable classes of CSPs are conservative. It is well known that lex-leader constraints may significantly reduce the number of solutions by excluding symmetric solutions of CSPs. We show that adding certain lex-leader constraints to any instance of any conservative class of CSPs still allows us to find all solutions with polynomial delay between successive solutions; the delay is polynomial in the total size of the instance together with the additional lex-leader constraints. It is well known that complete symmetry breaking may require an exponential number of lex-leader constraints. In practice, however, the number of additional lex-leader constraints is typically polynomial in the size of the instance. With polynomially many lex-leader constraints we may not, in general, achieve complete symmetry breaking, but they can still provide practically useful symmetry breaking -- and they sometimes exclude super-exponentially many solutions. We prove that for any instance from a conservative class, the time between finding successive solutions of the instance with polynomially many additional lex-leader constraints is polynomial even in the size of the instance without the lex-leader constraints.
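
    The abstract does not spell out an algorithm, so the following is a minimal, self-contained sketch of the lex-leader idea on a toy CSP with value symmetry; the toy constraint, the symmetry group, and all names are hypothetical choices for illustration, not the paper's method.

```python
# A minimal sketch (not the paper's algorithm) of how lex-leader conditions
# prune symmetric solutions of a toy CSP with value symmetry.
from itertools import product, permutations

NUM_VARS = 4               # x0..x3
DOMAIN = (0, 1, 2)         # each variable takes a "color"

def is_solution(x):
    # Toy constraint: adjacent variables must differ (a path-coloring CSP).
    return all(x[i] != x[i + 1] for i in range(len(x) - 1))

# Value symmetry: any permutation of the colors maps solutions to solutions.
SYMMETRIES = [dict(zip(DOMAIN, p)) for p in permutations(DOMAIN)]

def lex_leader(x):
    # Keep x only if it is lexicographically <= each of its symmetric images.
    return all(tuple(x) <= tuple(sigma[v] for v in x) for sigma in SYMMETRIES)

all_solutions = [x for x in product(DOMAIN, repeat=NUM_VARS) if is_solution(x)]
canonical = [x for x in all_solutions if lex_leader(x)]
print(len(all_solutions), "solutions,", len(canonical), "after lex-leader filtering")
```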

    Intermittent Demand Forecasting with Deep Renewal Processes

    Intermittent demand, where demand occurrences appear sporadically in time, is a common and challenging problem in forecasting. In this paper, we first make the connection between renewal processes and a collection of current models used for intermittent demand forecasting. We then develop a set of models that leverage recurrent neural networks to parameterize conditional inter-demand time and size distributions, building on the latest paradigm in "deep" temporal point processes. We present favorable empirical findings on discrete- and continuous-time intermittent demand data, validating the practical value of our approach. Comment: NeurIPS 2019 Workshop on Temporal Point Processes.
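
    As a minimal illustration of the renewal-process view the paper builds on, the sketch below re-encodes a sparse demand series as (inter-demand time, size) pairs -- the sequence a recurrent network would then parameterize. This is only the data transformation, not the paper's model; the example series is made up.

```python
# Re-encode an intermittent demand series as (interval since last demand, size)
# pairs, the representation renewal-process models operate on.
import numpy as np

def to_renewal_pairs(demand):
    """Convert a demand series into (inter-demand time, demand size) pairs."""
    pairs, gap = [], 0
    for d in demand:
        gap += 1
        if d > 0:
            pairs.append((gap, d))
            gap = 0
    return np.array(pairs)

demand = np.array([0, 0, 3, 0, 0, 0, 1, 0, 2, 0])
print(to_renewal_pairs(demand))   # [[3 3] [4 1] [2 2]]
```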

    Approximate Bayesian Inference in Linear State Space Models for Intermittent Demand Forecasting at Scale

    We present a scalable and robust Bayesian inference method for linear state space models. The method is applied to demand forecasting in the context of a large e-commerce platform, paying special attention to intermittent and bursty target statistics. Inference is approximated by the Newton-Raphson algorithm, reduced to linear-time Kalman smoothing, which allows us to operate on problems several orders of magnitude larger than in previous related work. In a study on large real-world sales datasets, our method outperforms competing approaches on fast- and medium-moving items.
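
    The linear-time Kalman smoothing primitive mentioned in the abstract can be illustrated on a scalar local-level model. The sketch below is a generic filter/smoother in NumPy, not the paper's Newton-Raphson approximation for non-Gaussian targets, and all hyperparameters are illustrative.

```python
# Kalman filter + RTS smoother for a scalar local-level model:
#   level_t = level_{t-1} + state noise,   y_t = level_t + observation noise.
# Both passes run in linear time in the series length.
import numpy as np

def local_level_smoother(y, obs_var=1.0, state_var=0.1):
    n = len(y)
    m, P = np.zeros(n), np.zeros(n)          # filtered means / variances
    m_pred, P_pred = 0.0, 1e4                # vague prior on the initial level
    for t in range(n):                       # forward (Kalman filter) pass
        K = P_pred / (P_pred + obs_var)      # Kalman gain
        m[t] = m_pred + K * (y[t] - m_pred)
        P[t] = (1 - K) * P_pred
        m_pred, P_pred = m[t], P[t] + state_var
    ms, Ps = m.copy(), P.copy()
    for t in range(n - 2, -1, -1):           # backward (RTS smoother) pass
        J = P[t] / (P[t] + state_var)
        ms[t] = m[t] + J * (ms[t + 1] - m[t])
        Ps[t] = P[t] + J ** 2 * (Ps[t + 1] - (P[t] + state_var))
    return ms, Ps

y = np.cumsum(np.random.default_rng(0).normal(size=50)) + 10.0
level_mean, level_var = local_level_smoother(y)
print(level_mean[:5])
```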

    Deep Factors for Forecasting

    Producing probabilistic forecasts for large collections of similar and/or dependent time series is a practically relevant and challenging task. Classical time series models fail to capture complex patterns in the data, and multivariate techniques struggle to scale to large problem sizes. However, their reliance on strong structural assumptions makes them data-efficient and allows them to provide uncertainty estimates. The converse is true for models based on deep neural networks, which can learn complex patterns and dependencies given enough data. In this paper, we propose a hybrid model that incorporates the benefits of both approaches. Our new method is data-driven and scalable via a latent, global, deep component. It also handles uncertainty through a local classical model. We provide both theoretical and empirical evidence for the soundness of our approach through a necessary and sufficient decomposition of exchangeable time series into a global and a local part. Our experiments demonstrate the advantages of our model in terms of data efficiency, accuracy, and computational complexity. Comment: http://proceedings.mlr.press/v97/wang19k/wang19k.pdf. arXiv admin note: substantial text overlap with arXiv:1812.0009
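
    A minimal sketch of the global/local decomposition described above: each series is a weighted combination of shared global factors plus a simple per-series local component. Here the global factors are a fixed sinusoid and a trend standing in for the learned deep factors of the paper; all shapes and parameters are illustrative.

```python
# Generate a synthetic panel as "global factors + local random walk", the
# structural decomposition the hybrid model is built around (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
T, num_series, num_factors = 100, 5, 2
t = np.arange(T)

# Global factors shared by all series (stand-ins for learned deep factors).
factors = np.stack([np.sin(2 * np.pi * t / 24), t / T])          # (K, T)
weights = rng.normal(size=(num_series, num_factors))             # per-series loadings

global_part = weights @ factors                                   # (N, T)
local_part = np.cumsum(0.1 * rng.normal(size=(num_series, T)), axis=1)
series = global_part + local_part                                 # synthetic panel

print(series.shape)   # (5, 100)
```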

    Anomaly Detection at Scale: The Case for Deep Distributional Time Series Models

    This paper introduces a new methodology for detecting anomalies in time series data, with a primary application to monitoring the health of (micro-) services and cloud resources. The main novelty in our approach is that instead of modeling time series consisting of real values or vectors of real values, we model time series of probability distributions over real values (or vectors). This extension to time series of probability distributions allows the technique to be applied to the common scenario where the data is generated by requests coming in to a service, which are then aggregated at a fixed temporal frequency. Our method is amenable to streaming anomaly detection and scales to monitoring millions of time series for anomalies. We show the superior accuracy of our method on synthetic and public real-world data. On the Yahoo Webscope data set, we outperform the state of the art in 3 out of 4 data sets, and we outperform popular open-source anomaly detection tools by up to a 17% average improvement on a real-world data set.
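
    To make the "time series of distributions" idea concrete, the sketch below summarizes each aggregation interval of raw request latencies by empirical quantiles and flags intervals whose quantile vector drifts from a trailing baseline. The distance measure, window, and threshold are arbitrary illustrative choices, not the paper's method.

```python
# Summarize per-interval request latencies as quantile vectors and flag
# intervals whose distribution drifts from a trailing baseline (illustrative).
import numpy as np

rng = np.random.default_rng(1)
QUANTILES = np.array([0.1, 0.5, 0.9])

# Simulate 48 intervals of raw request latencies; inject an anomaly at t=40.
intervals = [rng.gamma(shape=2.0, scale=50.0, size=rng.integers(200, 500))
             for _ in range(48)]
intervals[40] = intervals[40] * 3.0

q = np.stack([np.quantile(x, QUANTILES) for x in intervals])      # (T, 3)

window = 12
for t in range(window, len(q)):
    baseline = q[t - window:t].mean(axis=0)
    score = np.abs(q[t] - baseline).max() / baseline.max()
    if score > 0.5:
        print(f"interval {t}: anomalous distribution (score={score:.2f})")
```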

    Neural Temporal Point Processes: A Review

    Temporal point processes (TPPs) are probabilistic generative models for continuous-time event sequences. Neural TPPs combine fundamental ideas from the point process literature with deep learning approaches, thus enabling the construction of flexible and efficient models. The topic of neural TPPs has attracted significant attention in recent years, leading to the development of numerous new architectures and applications for this class of models. In this review paper we aim to consolidate the existing body of knowledge on neural TPPs. Specifically, we focus on important design choices and general principles for defining neural TPP models. Next, we provide an overview of application areas commonly considered in the literature. We conclude this survey with a list of open challenges and important directions for future work in the field of neural TPPs.
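
    A TPP is typically trained by maximizing the point-process log-likelihood, sum_i log lambda(t_i) minus the integral of lambda over the observation window. The sketch below evaluates this likelihood for a simple exponential-decay (Hawkes-type) intensity; a neural TPP would replace this parametric form with an intensity (or inter-event-time distribution) conditioned on an RNN or transformer encoding of the history. Parameters and data are illustrative.

```python
# Log-likelihood of a temporal point process with an exponential-decay
# (Hawkes-type) intensity: log L = sum_i log lambda(t_i) - int_0^T lambda(t) dt.
import numpy as np

def hawkes_loglik(event_times, T, mu=0.2, alpha=0.5, beta=1.0):
    event_times = np.asarray(event_times)
    loglik = 0.0
    for i, t in enumerate(event_times):
        past = event_times[:i]
        intensity = mu + alpha * np.sum(np.exp(-beta * (t - past)))
        loglik += np.log(intensity)
    # Compensator: integral of the intensity over [0, T], available in closed form.
    compensator = mu * T + (alpha / beta) * np.sum(1 - np.exp(-beta * (T - event_times)))
    return loglik - compensator

events = [0.5, 0.9, 2.3, 2.4, 5.1]
print(hawkes_loglik(events, T=6.0))
```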

    A simple and effective predictive resource scaling heuristic for large-scale cloud applications

    We propose a simple yet effective policy for the predictive auto-scaling of horizontally scalable applications running in cloud environments, where compute resources can only be added with a delay and where deployment throughput is limited. Our policy uses a probabilistic forecast of the workload to make scaling decisions that depend on the risk aversion of the application owner. In experiments on real-world and synthetic data, we show that this policy compares favorably to mathematically more sophisticated approaches as well as to simple benchmark policies.
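
    As a rough illustration of a risk-aware, forecast-driven scaling rule in the spirit of the abstract (not the paper's exact policy), the sketch below takes sample paths of a probabilistic workload forecast over the provisioning-delay window, picks a quantile matching the owner's risk aversion, and converts it into an instance count. All names and numbers are hypothetical.

```python
# Quantile-based scaling rule driven by a probabilistic workload forecast
# (illustrative; the real policy and its parameters may differ).
import numpy as np

def target_instances(forecast_samples, risk_quantile=0.95,
                     per_instance_capacity=100.0, current=1):
    """forecast_samples: array (num_samples, horizon) of requests/sec."""
    # Worst expected load anywhere inside the provisioning-delay window.
    peak_per_sample = forecast_samples.max(axis=1)
    demand = np.quantile(peak_per_sample, risk_quantile)
    needed = int(np.ceil(demand / per_instance_capacity))
    return max(needed, current)   # never scale below what is already running

rng = np.random.default_rng(2)
samples = rng.lognormal(mean=6.0, sigma=0.3, size=(1000, 10))   # ~400 req/s median
print(target_instances(samples, risk_quantile=0.95))
```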

    The Effectiveness of Discretization in Forecasting: An Empirical Study on Neural Time Series Models

    Time series modeling techniques based on deep learning have seen many advancements in recent years, especially in data-abundant settings and with the central aim of learning global models that can extract patterns across multiple time series. While the crucial importance of appropriate data pre-processing and scaling has often been noted in prior work, most studies focus on improving model architectures. In this paper we empirically investigate the effect of data input and output transformations on the predictive performance of several neural forecasting architectures. In particular, we investigate the effectiveness of several forms of data binning, i.e., converting real-valued time series into categorical ones, when combined with feed-forward, recurrent, and convolution-based sequence models. In many non-forecasting applications where these models have been very successful, the model inputs and outputs are categorical (e.g. words from a fixed vocabulary in natural language processing applications or quantized pixel color intensities in computer vision). For forecasting applications, where the time series are typically real-valued, various ad-hoc data transformations have been proposed but have not been systematically compared. To remedy this, we evaluate the forecasting accuracy of instances of the aforementioned model classes when combined with different types of data scaling and binning. We find that binning almost always improves performance (compared to using normalized real-valued inputs), but that the particular type of binning chosen is of lesser importance.
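
    One simple transformation of the kind such a study examines is quantile binning: real values are mapped to categorical bin indices for the model and mapped back to values via bin centers. The sketch below shows this round trip; the bin count and data are illustrative, and the exact binning schemes evaluated in the paper may differ.

```python
# Quantile binning of a real-valued series into categories and back again.
import numpy as np

def fit_bins(values, num_bins=32):
    edges = np.quantile(values, np.linspace(0, 1, num_bins + 1))
    edges = np.unique(edges)                      # guard against duplicate edges
    centers = 0.5 * (edges[:-1] + edges[1:])
    return edges, centers

def encode(values, edges):
    # Map each value to the index of the bin it falls into.
    return np.clip(np.digitize(values, edges[1:-1]), 0, len(edges) - 2)

rng = np.random.default_rng(3)
series = rng.gamma(2.0, 10.0, size=500)
edges, centers = fit_bins(series)
tokens = encode(series, edges)                    # categorical inputs for the model
reconstructed = centers[tokens]                   # map categories back to values
print(np.mean(np.abs(series - reconstructed)))    # quantization error
```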

    Intermittent Demand Forecasting with Renewal Processes

    Intermittency is a common and challenging problem in demand forecasting. We introduce a new, unified framework for building intermittent demand forecasting models, which incorporates existing methods and allows them to be generalized in several directions. Our framework is based on extensions of well-established model-based methods to discrete-time renewal processes, which can parsimoniously account for patterns such as aging, clustering and quasi-periodicity in demand arrivals. The connection to discrete-time renewal processes allows not only for a principled extension of Croston-type models, but also for a natural inclusion of neural-network-based models, by replacing exponential smoothing with a recurrent neural network. We also demonstrate that modeling continuous-time demand arrivals, i.e., with a temporal point process, is possible via a trivial extension of our framework. This leads to more flexible modeling in scenarios where data on individual purchase orders are directly available with granular timestamps. Complementing this theoretical advancement, we demonstrate the efficacy of our framework for forecasting practice via an extensive empirical study on standard intermittent demand data sets, in which we report predictive accuracy in a variety of scenarios that compares favorably to the state of the art.
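
    The classical special case that this framework generalizes is Croston-type forecasting, which exponentially smooths demand sizes and inter-demand intervals separately and forecasts their ratio. The sketch below implements that baseline; the smoothing constant and example series are illustrative, and the paper's renewal-process models go well beyond it.

```python
# Croston's method: smooth demand sizes and inter-demand intervals separately,
# then forecast the demand rate as their ratio (classical baseline, illustrative).
import numpy as np

def croston(demand, alpha=0.1):
    size, interval, gap = None, None, 0
    for d in demand:
        gap += 1
        if d > 0:
            if size is None:                      # initialize at first demand
                size, interval = float(d), float(gap)
            else:
                size = alpha * d + (1 - alpha) * size
                interval = alpha * gap + (1 - alpha) * interval
            gap = 0
    return size / interval if size is not None else 0.0   # demand rate per period

demand = np.array([0, 0, 3, 0, 0, 0, 1, 0, 2, 0])
print(croston(demand))   # expected demand per period
```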

    GluonTS: Probabilistic Time Series Models in Python

    We introduce Gluon Time Series (GluonTS, available at https://gluon-ts.mxnet.io), a library for deep-learning-based time series modeling. GluonTS simplifies the development of and experimentation with time series models for common tasks such as forecasting or anomaly detection. It provides all the necessary components and tools that scientists need for quickly building new models, for efficiently running and analyzing experiments, and for evaluating model accuracy. Comment: ICML Time Series Workshop 2019.
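
    A minimal usage sketch, assuming an early GluonTS 0.x release (contemporary with the workshop paper) in which DeepAREstimator and Trainer are importable from gluonts.model.deepar and gluonts.trainer; newer releases move these modules (e.g. under gluonts.mx), so the import paths below may need adjusting.

```python
# Train a DeepAR model on a built-in benchmark dataset and draw forecasts.
# Import paths assume an early GluonTS 0.x release and may differ in newer versions.
from gluonts.dataset.repository.datasets import get_dataset
from gluonts.model.deepar import DeepAREstimator
from gluonts.trainer import Trainer

dataset = get_dataset("m4_hourly")                  # built-in benchmark dataset
estimator = DeepAREstimator(
    freq=dataset.metadata.freq,
    prediction_length=dataset.metadata.prediction_length,
    trainer=Trainer(epochs=5),
)
predictor = estimator.train(dataset.train)          # returns a Predictor
forecasts = list(predictor.predict(dataset.test))   # probabilistic sample forecasts
print(forecasts[0].mean[:5])
```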